[00] Sensing Through the Body: Glove for AR Interaction
2025 - 2026 Fall
Academic + Group Work
Massachusetts Institute of Technology - 3.173 Computing Fabrics
Collaborators: Clara Emmerling
Human–machine interfaces (HMIs) are increasingly designed to incorporate sensing and haptic output, creating more intuitive and immersive interactions across fields such as healthcare, robotics, and AR/VR. Developments in soft electronics are accelerating this shift, enabling interactions that move beyond rigid devices toward soft, wearable systems.
Sensing capabilities can be seamlessly integrated into textile-based wearables through digital machine knitting and the incorporation of functional materials. This project focuses on designing an astronaut's inner glove with an integrated neural network (NN) fiber that provides gesture detection for complex augmented reality missions in lunar environments. The thick outer glove makes it significantly harder to perform fine finger gestures or to interact with contact surfaces such as buttons; integrating gesture detection into the inner glove eliminates the need for a contact surface altogether.
Astronaut's outer glove:
thick insulated shell, difficult to use delicate instruments + equipment
Astronaut's inner glove:
smart + gesture-detecting layer used for navigation and communication
The main design concept for the glove was to make it modular and adaptable to different use scenarios and users, since customization is a core need.
system design diagram
design idea and implementation:
modularizing the glove [01], schematic glove system [02]
design - 1
schematic & prototype
design - 2
schematic & prototype
knit prototyping
on a domestic single-bed Silver Reed machine
second [02], third [03]
stitch pattern generation for complete glove garment
knit prototyping stages
insertion of the NN fiber [t1]
and carrying it through the glove skin [t2]
in-fiber neural network (NN)
The embedded fiber contains an accelerometer, and the in-fiber neural network is trained to recognize gestures from its motion data. The two designs targeted different gesture sets: design 1 recognized curl [1] and idle [2], whereas design 2 recognized curl [1], idle [2], and right swipe [3].
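The exact in-fiber model architecture isn't detailed here, so the following is a minimal illustrative sketch only: a small 1D-CNN classifier (PyTorch) over fixed-length 3-axis accelerometer windows. The class names, window length, and layer sizes are assumptions, not the project's actual network.

```python
# Minimal sketch (assumed architecture): a small 1D-CNN gesture classifier
# over 3-axis accelerometer windows. Shapes and class list are illustrative.
import torch
import torch.nn as nn

GESTURES = ["curl", "idle", "right_swipe"]  # design 2; design 1 omits "right_swipe"

class GestureNet(nn.Module):
    def __init__(self, n_classes: int = len(GESTURES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2),  # 3 accelerometer axes in
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                     # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 3, window) raw accelerometer samples
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = GestureNet()
    dummy = torch.randn(8, 3, 128)   # batch of 8 windows, 128 time steps each
    logits = model(dummy)            # (8, 3) class scores
    print(logits.argmax(dim=1))      # predicted gesture indices
```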
training set strategy
about 60 samples per gesture class
samples of varying duration (from 0.5 to 4 s)
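Because the recorded samples vary in duration (0.5 to 4 s), they have to be brought to a common length before they can be batched for training. The sketch below shows one possible approach, linear resampling to a fixed window with NumPy; the window length and the 50 Hz rate in the example are assumptions, not values from the project.

```python
# Minimal sketch: resample variable-length accelerometer recordings (0.5-4 s)
# to a fixed-length window so they can be stacked into training batches.
import numpy as np

WINDOW = 128  # fixed number of time steps per training sample (assumed)

def resample_recording(recording: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Linearly interpolate a (T, 3) accelerometer recording to (window, 3)."""
    t_old = np.linspace(0.0, 1.0, num=len(recording))
    t_new = np.linspace(0.0, 1.0, num=window)
    return np.stack(
        [np.interp(t_new, t_old, recording[:, axis]) for axis in range(3)],
        axis=1,
    )

if __name__ == "__main__":
    short = np.random.randn(25, 3)    # ~0.5 s at an assumed 50 Hz
    long = np.random.randn(200, 3)    # ~4 s at an assumed 50 Hz
    print(resample_recording(short).shape, resample_recording(long).shape)
    # both -> (128, 3), ready to stack into a training batch
```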
design - 1 metrics
design - 2 metrics
electronics schematic for the embedded PCB
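As a hypothetical companion to the electronics schematic, the sketch below logs accelerometer samples streamed from the embedded PCB over a serial link on a host machine, e.g. for collecting labeled gesture recordings. The port name, baud rate, and "ax,ay,az" line format are assumptions about the firmware, not documented behavior.

```python
# Minimal sketch: log accelerometer samples streamed from the embedded PCB over
# serial, one CSV file per labeled gesture recording. Port, baud rate, and the
# "ax,ay,az" line format are assumptions.
import csv
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # hypothetical serial port
BAUD = 115200

def log_samples(path: str, n_samples: int = 500) -> None:
    with serial.Serial(PORT, BAUD, timeout=1) as link, open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ax", "ay", "az"])
        for _ in range(n_samples):
            line = link.readline().decode(errors="ignore").strip()
            parts = line.split(",")
            if len(parts) == 3:          # skip malformed or empty lines
                writer.writerow(parts)

if __name__ == "__main__":
    log_samples("curl_sample_001.csv")   # one labeled recording per file
```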
AR interface used with the Sensing Glove
software used: Unity
gesture legend: curling [create element], right swipe [delete element]
system development for intelligent navigation in lunar environments: AR interface prototype designed by Qilmeg Doudatcz
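The glove-to-Unity link is not detailed here; as one possible arrangement, the sketch below forwards classified gestures to the Unity AR interface over a local UDP socket, applying the gesture legend above (curl creates an element, right swipe deletes one). The port number, message format, and the UDP bridge itself are assumptions.

```python
# Minimal sketch: forward recognized gestures to the Unity AR interface over a
# local UDP socket. Port, message format, and the bridge design are assumptions.
import socket

UNITY_ADDR = ("127.0.0.1", 5065)   # hypothetical port the Unity app listens on

# Gesture legend from the AR prototype: curl creates, right swipe deletes.
GESTURE_TO_ACTION = {
    "curl": "create_element",
    "right_swipe": "delete_element",
    "idle": None,                  # idle triggers no AR action
}

def send_gesture(gesture: str, sock: socket.socket) -> None:
    action = GESTURE_TO_ACTION.get(gesture)
    if action is not None:
        sock.sendto(action.encode(), UNITY_ADDR)

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for g in ["idle", "curl", "right_swipe"]:   # e.g. a stream of classifier outputs
            send_gesture(g, sock)
```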
extra +
diverse patterns that could be incorporated into the glove modules for varied haptics and tactility, forming a texture library
The samples showcase different patterns that introduce 3D extrusions: some create localized peaks within the patterned regions, whereas others introduce more homogeneous textures throughout the sample. These are a few examples produced with just three different needle actions on the Silver Reed manual knitting machine with DesignaKnit software.